What is it that Makes a Microsoft Executable a Microsoft Executable? An Attacker’s and a Defender’s Perspective

Matt Graeber
Jul 6, 2018

What exactly is it that separates arbitrary code from code that originates from Microsoft? I would wager that the reaction of most people would be to claim, “well… if it’s signed by Microsoft, then it comes from Microsoft. What else is there to talk about?”

I pose this question because it has come up countless times while performing threat hunting assessments, where event volume is so great that a top defender priority is to filter out the noise of known-good binaries and behaviors. In doing so (theoretically speaking), the only events that remain are suspicious ones (i.e. any event not explicitly classified as benign — untrusted, malicious, potentially unwanted, etc.) in need of triage. One must approach this process with extreme caution, however, as evasive attackers will always consider methods to blend in with binaries and behaviors that are naively assumed to be benign.

How might one go about classifying a Microsoft executable as benign? Well, in order to define such a process, it helps to understand how an attacker might attempt to pose as a Microsoft executable. Let’s consider the following evasive attack scenarios:

1. A signed, abusable EXE was copied to another directory, renamed to a benign-sounding filename, and executed to evade naïve command-line detections.

Consider the following scenario: a purposely evasive attacker wants to execute malicious PowerShell code and evade command-line logging. They know that defenders can easily detect the presence of “powershell” and all possible variations of “-Command” and “-EncodedCommand” on the command line, so they copy powershell.exe to an attacker-controlled directory and rename it notepad.exe. A more mature defender is wise to this technique, and their detections consider only the variations of “-Command” and “-EncodedCommand” on the command line rather than the executable name. To counter this, the resourceful attacker drops a malicious PowerShell profile that does nothing more than read and execute PowerShell code from a benign-sounding .DAT file (a minimal sketch of such a profile follows the list below). Doing this achieves two things:

  1. The attacker executes their malicious PowerShell payload as notepad.exe with no command-line arguments that defenders are likely looking for.
  2. Traditional anti-virus is more likely to place additional scrutiny on .PS1 files than .DAT files (or any other arbitrary file extension).
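
To make the profile trick concrete, here is a minimal sketch of what such a profile might look like; the payload filename and location are hypothetical:

# Microsoft.PowerShell_profile.ps1 (hypothetical malicious profile, sketch only)
# Reads PowerShell code from a benign-sounding .DAT file and executes it, so the renamed
# copy of powershell.exe launches with no suspicious command-line arguments to log.
$payloadPath = Join-Path -Path $env:APPDATA -ChildPath 'settings.dat'   # hypothetical name/location
if (Test-Path -Path $payloadPath) {
    Invoke-Expression -Command (Get-Content -Path $payloadPath -Raw)
}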

The high-level goal of detection in this scenario would be the following:

Detect the use of known, abusable, signed applications that deviate from their expected path and filename.

It is also important to distill the attacker techniques used in this scenario:

  1. A known, abusable, signed application (powershell.exe in this case), was renamed to something besides its expected filename and executed.
  2. A known, abusable, signed application (powershell.exe in this case), was executed from a directory outside of its expected directory.
  3. A PowerShell-specific command-line evasion technique was employed — use of malicious PowerShell profile.

You may already be thinking that PowerShell v5 scriptblock logging and “Windows PowerShell” event ID 400 logs would capture this evasive tradecraft and you would be right. The purpose of this scenario was to highlight the evasive action of renaming and copying executables independent of the specific vendor.

2. An attacker attempts to “blend in with normal” by dropping a malicious executable that gives off the appearance that it is a benign, Microsoft executable.

Consider the following scenario: You gain access to a machine as a standard user and want to persist a malicious executable. Where’s a good place to persist such that you blend in with what would otherwise look normal? This would be a case where perhaps the attacker has done their homework and used Autoruns for malicious purposes. Instead of using it to find potential evil, they use it to identify places to hide. So, they look for all built-in, out-of-the-box persistence entries that point to a binary residing in a directory that they can control. OneDrive is a great candidate, as it is persisted as a run key by default and it resides in the user’s %LOCALAPPDATA% directory. So, the attacker simply replaces OneDrive.exe with their malicious executable, leaving the existing run key unchanged.

HKCU run key entry for OneDrive which resides in the user’s AppData directory
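
One way to hunt for this class of hijack candidate yourself is to enumerate per-user Run entries and flag any whose target lives under a user-writable profile path. A minimal sketch (it does not handle quoted paths or command-line arguments):

# Enumerate HKCU Run entries whose targets reside under the user profile (sketch)
$runKey = 'HKCU:\Software\Microsoft\Windows\CurrentVersion\Run'
(Get-Item -Path $runKey).Property | ForEach-Object {
    $target = (Get-ItemProperty -Path $runKey -Name $_).$_
    if ($target -like "*$env:APPDATA*" -or $target -like "*$env:LOCALAPPDATA*") {
        [PSCustomObject]@{ ValueName = $_; Target = $target }
    }
}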

The high-level goal of detection in this scenario would be the following:

For a common persistence entry, what are the attributes of the executable that make it normal? If “normal” can be established, then a deviation from normal should be considered suspect.

Let’s distill the attacker techniques used in this scenario:

  1. The OneDrive executable that is expected to be present in that directory originates from Microsoft. The binary supplied by the attacker does not originate from Microsoft.
  2. Rather than creating a new run key, the attacker replaced the binary pointed to by an existing, legitimate run key.

3. An attacker backdoors a system executable.

In this scenario, an attacker backdoors a system executable using a tool like The Backdoor Factory.

The following goal should be considered from a detection standpoint:

A system executable is expected to reside in a particular directory and should have a valid Microsoft signature. Any deviation should be considered suspicious.
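
A coarse first pass at this goal is to sweep the expected system directories for executables whose signatures do not validate. A minimal sketch (run it on the target system so catalog signatures resolve; it will be slow):

# Flag system executables whose Authenticode/catalog signature does not validate (sketch)
Get-ChildItem -Path "$env:windir\System32\*.exe" | ForEach-Object {
    Get-AuthenticodeSignature -FilePath $_.FullName
} | Where-Object { $_.Status -ne 'Valid' } | Select-Object -Property Path, Status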

4. An attacker spoofs a Microsoft digital signature and signs their malicious code with a “Microsoft” certificate.

An evasive attacker can apply a legitimate Microsoft digital signature to their malicious code and force it to become valid by performing a SIP/trust provider hijack attack (requires elevation). They could also craft a certificate chain that has the look and feel of a Microsoft certificate chain and explicitly trust their fake “Microsoft” root CA on the victim system.

The high-level goal of detection in this scenario would be the following:

Detect a change to the registry values responsible for code signing validation, identify root CAs worthy of trust, and alert upon a deviation from an expected root CA for a given binary.
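
As a starting point for the first part of that goal, the registry locations that control signature validation can be baselined and monitored for change. The following sketch dumps the SIP and trust provider registrations on a 64-bit system (WOW6432Node equivalents exist as well):

# Dump the DLL/function registrations consulted during signature validation (sketch).
# Hijacking these values is what allows arbitrary code to appear validly signed.
$paths = @(
    'HKLM:\SOFTWARE\Microsoft\Cryptography\OID\EncodingType 0\CryptSIPDllGetSignedDataMsg\*',
    'HKLM:\SOFTWARE\Microsoft\Cryptography\OID\EncodingType 0\CryptSIPDllVerifyIndirectData\*',
    'HKLM:\SOFTWARE\Microsoft\Cryptography\Providers\Trust\FinalPolicy\*'
)
Get-ItemProperty -Path $paths | Select-Object -Property PSChildName, Dll, FuncName, '$DLL', '$Function'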

Common Defender Assumptions

Given these particularly evasive attack scenarios, it is important to be honest with oneself and admit to some common defender assumptions; otherwise, there will be no hope for continual improvement in the presence of mature attackers. Also, as Dane Stuckey eloquently put it, “detection engineering is a journey that never ends. If you allow yourself to reach your destination, you’ve already failed.”

  • A defender may not consider that an attacker might rename an executable or consider the rationale for doing so.
  • A defender may not consider the attacker rationale for copying an executable to another directory.
  • An assumption may be that an attacker would be more likely to establish a new persistence technique over hijacking an existing, legitimate one.
  • A defender may not consider or have the optics to detect the loading of an unsigned system executable (or one with an invalid signature) that should otherwise have a validated signature.
  • Defenders are likely to implicitly trust the output of signature validation utilities and not consider the meaning/implications of root CA trust.

Now, identifying a problem isn’t very useful without a proposed solution. What follows will be a discussion of the attributes of an executable that should be considered in addressing and overcoming these highlighted assumptions.

Defining the features of a benign Microsoft executable

So from a defender’s perspective, what attributes of an executable can we use to classify a given Microsoft executable as a proper Microsoft executable? I propose the following:

1. The original path of the executable.

For example, it should be noted that notepad.exe is expected to reside in %windir%, %windir%\System32, and %windir%\SysWOW64. Any deviation from those paths should be considered suspicious. Also note that I refer to the Windows directory by its environment variable. Never assume the boot partition is “C”.
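
One way to apply this is to store the expected paths with the environment variable unexpanded and expand them at comparison time on the host being evaluated. A quick sketch (the observed path is hypothetical):

# Compare an observed file's directory against the expected locations for notepad.exe (sketch)
$expectedDirs = '%windir%', '%windir%\System32', '%windir%\SysWOW64' |
    ForEach-Object { [Environment]::ExpandEnvironmentVariables($_) }
$observed = 'D:\Users\Public\notepad.exe'   # hypothetical observed path
$expectedDirs -contains (Split-Path -Path $observed -Parent)   # False, so the path deviates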

2. The expected filename of the executable.

Any deviation from the expected filename should be considered suspect.

3. The original filename of the executable.

The original filename is present in the “Version Info” resource of nearly all signed PE files. It is what you see when you look at the “Details” tab of a PE file’s properties.

Note that original filename will not always be the same as the expected filename on disk. What’s nice about original filename though is that any attempt to modify it will invalidate the signature of the binary. This is how Windows Defender Application Control (WDAC) blocks individual files.

4. The file description of the executable.

This is another field present in the “Version Info” resource that can be considered in addition to or in place of the original filename. There may be some cases where original filename may not be present so the file description should be considered in its place. This was the case with fsi.exe — an abusable binary discovered by Matt Nelson that originally could not be effectively blocked by WDAC until Microsoft added the ability to block by other “Version Info” properties.
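
Both fields are exposed through a file’s version resource and can be read directly in PowerShell:

# Read OriginalFilename and FileDescription from the version resource
$vi = (Get-Item -Path "$env:windir\System32\WindowsPowerShell\v1.0\powershell.exe").VersionInfo
$vi.OriginalFilename   # PowerShell.EXE.MUI
$vi.FileDescription    # Windows PowerShell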

5. The signature status of the executable.

For a binary that is expected to be signed, any signature validation status other than “valid” should be considered suspect. The Get-AuthenticodeSignature cmdlet supports the following values for reference:

  • HashMismatch — indicates that the integrity of the executable is compromised
  • Incompatible
  • NotSigned
  • NotSupportedFileFormat
  • NotTrusted — indicates that the signer’s certificate was either revoked or explicitly marked as disallowed.
  • Valid — indicates that the executable’s integrity was validated and that the certificate chain properly chains to a trusted root CA.

It should be noted that it is a common misconception that if a PE file doesn’t have a “Digital Signatures” tab, it is not signed. Nearly all code built into Windows is signed, the majority of it catalog-signed. This can certainly be problematic if signature validation is performed somewhere besides the target system, a challenge articulated by VirusTotal.

6. Is the file an in-box Windows binary?

Any code that ships with the operating system will be Windows-signed, indicated by the presence of a Microsoft digital signature that includes the “Windows System Component Verification” EKU (OID 1.3.6.1.4.1.311.10.3.6). The “IsOSBinary” property of Get-AuthenticodeSignature output will return true if the signature is valid, is Windows-signed, and is rooted to a small, trusted set of Microsoft root CAs.
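
Both checks are straightforward with Get-AuthenticodeSignature, provided the check runs on the target system so catalog signatures resolve. For example:

# Validate an in-box binary and confirm the Windows System Component Verification EKU
$sig = Get-AuthenticodeSignature -FilePath "$env:windir\System32\ntdll.dll"
$sig.Status      # Valid on a healthy system
$sig.IsOSBinary  # True for Windows-signed code rooted to a trusted Microsoft root CA
$sig.SignerCertificate.EnhancedKeyUsageList | Where-Object { $_.ObjectId -eq '1.3.6.1.4.1.311.10.3.6' }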

7. Signer subject and thumbprint

The subject field of a certificate identifies the organization to which the certificate was issued. The thumbprint (i.e. the SHA1 hash of the certificate) is used to assert the integrity of the certificate. There is a large set of Microsoft signing certificates, so from a detection perspective, this field should be used only for informational purposes. More accurately, code that is signed by Microsoft is signed by a certificate that chains to one of a small set of Microsoft root CA certificates.

8. Root issuer subject and thumbprint

A signature is only as trustworthy as the root CA from which it was issued. Attackers can easily trust a malicious root CA certificate on a victim system which is why it’s important to identify the small set of Microsoft root CAs when performing signature validation. They consist of the following thumbprint/subject values:

  • CDD4EEAE6000AC7F40C3802C171E30148030C072 — CN=Microsoft Root Certificate Authority, DC=microsoft, DC=com
  • A43489159A520F0D93D032CCAF37E7FE20A8B419 — CN=Microsoft Root Authority, OU=Microsoft Corporation, OU=Copyright © 1997 Microsoft Corp.
  • 3B1EFD3A66EA28B16697394703A72CA340A05BD5 — CN=Microsoft Root Certificate Authority 2010, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
  • 8F43288AD272F3103B6FB1428485EA3014C0BCFE — CN=Microsoft Root Certificate Authority 2011, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
  • (Optional, if you want to trust Windows Insider Preview code) F8DB7E1C16F1FFD4AAAD4AAD8DFF0F2445184AEB — CN=Microsoft Development Root Certificate Authority 2014, O=Microsoft Corporation, L=Redmond, S=Washington, C=US

Don’t trust me though! Go validate these yourself by comparing against authroot.stl or by using sigcheck -tv and sigcheck -tuv.
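
To extract the root thumbprint for comparison, one approach is to build the chain from the signer certificate and inspect its last element. A sketch (chain building honors the local machine’s certificate stores, so perform this on a known-good system):

# Resolve the root CA of a file's signer and compare it against the allowed Microsoft roots (sketch)
$allowedRoots = 'CDD4EEAE6000AC7F40C3802C171E30148030C072',
                'A43489159A520F0D93D032CCAF37E7FE20A8B419',
                '3B1EFD3A66EA28B16697394703A72CA340A05BD5',
                '8F43288AD272F3103B6FB1428485EA3014C0BCFE'
$sig = Get-AuthenticodeSignature -FilePath "$env:windir\System32\kernel32.dll"
$chain = New-Object -TypeName Security.Cryptography.X509Certificates.X509Chain
$null = $chain.Build($sig.SignerCertificate)
$root = $chain.ChainElements[$chain.ChainElements.Count - 1].Certificate
$allowedRoots -contains $root.Thumbprint   # Should be True for a genuine Microsoft chain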

Establishing a baselining procedure

All of the relevant properties highlighted above would need to be first baselined on a set of clean systems — a standard enterprise gold image would be ideal. How you decide to capture these properties is up to you. I wrote this PowerShell script to obtain all the relevant information from a gold image:

GetPEFeature.ps1 — a PowerShell script to extract relevant features from a PE file

The script would produce the following example output:

Sample output of the Get-PEFeature function
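
As a rough sketch of the idea (this is not the original Get-PEFeature implementation, just an approximation of the attributes it collects), the per-file features could be gathered like this, with the root fields derived via chain building as shown earlier:

# Approximation of the feature collection idea (not the original GetPEFeature.ps1)
function Get-PEFeatureSketch {
    param([Parameter(Mandatory = $true)][string] $FilePath)
    $file = Get-Item -Path $FilePath
    $sig  = Get-AuthenticodeSignature -FilePath $FilePath
    [PSCustomObject]@{
        ExpectedPath     = $file.DirectoryName
        ExpectedFileName = $file.Name
        OriginalFileName = $file.VersionInfo.OriginalFilename
        FileDescription  = $file.VersionInfo.FileDescription
        SigningStatus    = $sig.Status
        IsOSBinary       = $sig.IsOSBinary
        SignerSubject    = $sig.SignerCertificate.Subject
        SignerThumbprint = $sig.SignerCertificate.Thumbprint
        # RootSubject/RootThumbprint would come from an X509Chain build (see earlier sketch)
    }
}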

Collectively, these attributes constitute the profile of a benign Microsoft executable, and a deviation in any of them should, in many cases, be considered suspect. Note that “ExpectedPath” is likely to be the most false-positive-prone property.

Application of baselining results

Assuming you collected a baseline of benign Microsoft PE attributes, let’s apply some detections to the evasive attack scenarios covered earlier:

1. A signed, abusable EXE was copied to another directory, renamed to a benign-sounding filename, and executed to evade naïve command-line detections.

Having already established a definition for “expected PowerShell,” powershell.exe would be expected to have the following traits:

ExpectedPath     : %windir%\System32\WindowsPowerShell\v1.0
ExpectedFileName : powershell.exe
OriginalFileName : PowerShell.EXE.MUI
FileDescription  : Windows PowerShell
SigningStatus    : Valid
IsOSBinary       : True
SignerSubject    : CN=Microsoft Windows, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
SignerThumbprint : 419E77AED546A1A6CF4DC23C1F977542FE289CF7
RootSubject      : CN=Microsoft Root Certificate Authority 2010, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
RootThumbprint   : 3B1EFD3A66EA28B16697394703A72CA340A05BD5

So with powershell.exe renamed to notepad.exe and executed from a non-standard directory, how would we know that it is actually powershell.exe without relying upon file hashing? We know because the renamed notepad.exe would have an OriginalFileName of “PowerShell.EXE.MUI” and a valid Microsoft signature (i.e. its integrity is intact and it chains to a known Microsoft root certificate). Both the ExpectedPath and ExpectedFileName fields were deviated from, so an alert should be generated.
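
As a concrete detection sketch (the $baseline variable here is a hypothetical hashtable of your collected attributes keyed by OriginalFileName), the renamed copy would be flagged like this:

# Flag a file whose OriginalFileName maps to a baselined binary but whose path/name deviate (sketch)
$observed = Get-Item -Path 'C:\Users\Public\notepad.exe'     # hypothetical observed file
$entry = $baseline[$observed.VersionInfo.OriginalFilename]   # keys off 'PowerShell.EXE.MUI'
if ($entry -and (
        $observed.Name -ne $entry.ExpectedFileName -or
        $observed.DirectoryName -ne [Environment]::ExpandEnvironmentVariables($entry.ExpectedPath))) {
    Write-Warning "Possible renamed/relocated $($entry.ExpectedFileName): $($observed.FullName)"
}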

2. An attacker attempts to “blend in with normal” by dropping a malicious executable that gives off the appearance that it is a benign, Microsoft executable.

Binaries for known, prevalent persistence mechanisms should be baselined. Assuming that was performed, OneDrive.exe would be expected to have the following properties:

ExpectedPath     : %LOCALAPPDATA%\Microsoft\OneDrive
ExpectedFileName : OneDrive.exe
OriginalFileName : OneDrive.exe
FileDescription  : Microsoft OneDrive
SigningStatus    : Valid
IsOSBinary       : False
SignerSubject    : CN=Microsoft Corporation, OU=MOPR, O=Microsoft Corporation, L=Redmond, S=Washington, C=US
SignerThumbprint : 5EAD300DC7E4D637948ECB0ED829A072BD152E17
RootSubject      : CN=Microsoft Root Certificate Authority, DC=microsoft, DC=com
RootThumbprint   : CDD4EEAE6000AC7F40C3802C171E30148030C072

In this case, the malicious executable that was dropped would retain the ExpectedPath and ExpectedFileName properties but none of the other properties.

3. An attacker backdoors a system executable.

Similar to the previous scenario, the backdoored executable would retain the ExpectedPath, ExpectedFileName, and FileDescription properties but none of the other properties.

4. An attacker spoofs a Microsoft digital signature and signs their malicious code with a “Microsoft” certificate.

In the case of an attacker signing their malware with a certificate chain made to look like a Microsoft certificate chain, a SignerSubject and/or RootSubject field containing “Microsoft” that does not have one of the whitelisted RootThumbprint values should be considered extremely suspect.

Additional proposed attributes

One attribute that you might consider adding manually to a baselined list would be a “KnownAbused” attribute that indicates that it is a binary that is known to be abused by attackers. This would allow you to cluster related “living off the land”-style detections together. A good initial reference for what binaries to mark as known abused would be to monitor the canonical WDAC blacklist rule set from Microsoft.

If you were to track known-abused software, you could then also consider adding a field that consists of a set of command-line switches for the given executable that are difficult for an attacker to evade.
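
For example (the values below are purely illustrative, not a vetted list):

# Illustrative baseline entry extended with manually curated fields
$entry = [PSCustomObject]@{
    ExpectedFileName   = 'powershell.exe'
    KnownAbused        = $true
    SuspiciousSwitches = '-EncodedCommand', '-enc', '-NoProfile'
}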

Conclusion

Given the threat of intentionally evasive tradecraft, it is important not only to know how to recognize, for example, what it is that makes powershell.exe powershell.exe, but also to have a practical means by which to make such determinations and observe deviations.

I hope to have convinced you of the importance of the PE attributes highlighted in this article, and I encourage you to request visibility into these attributes from your endpoint security vendor if you don’t have it already.
